Distances Between Probability Distributions of Different Dimensions

Authors

Abstract

Comparing probability distributions is an indispensable and ubiquitous task in machine learning and statistics. The most common way to compare a pair of Borel probability measures is to compute a metric between them, and by far the most widely used notions of metric are the Wasserstein metric and the total variation metric. The next most common way is to compute a divergence between them, and in this case almost every known divergence, such as those of Kullback–Leibler, Jensen–Shannon, Rényi, and many more, is a special case of the $f$-divergence. Nevertheless, these metrics and divergences may only be computed, and in fact are only defined, when the pair of probability measures is on spaces of the same dimension. How would one quantify, say, the KL-divergence between the uniform distribution on the interval $[-1, 1]$ and a Gaussian distribution on $\mathbb{R}^{3}$? We show that these common notions of metrics and divergences give rise to natural distances between Borel probability measures defined on spaces of different dimensions, e.g., one on $\mathbb{R}^{m}$ and another on $\mathbb{R}^{n}$ where $m$ and $n$ are distinct, so as to give a meaningful answer to the previous question.
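To make the dimension-mismatch problem concrete, here is a minimal sketch (illustrative, not from the paper) of the classical KL-divergence for discrete distributions. Note that it is only defined when both distributions live on the same support; the names and example values are our own.

```python
import numpy as np

def kl_divergence(p, q):
    """KL(p || q) for discrete distributions given on a common support."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    if p.shape != q.shape:
        # The classical definition breaks down here; this is exactly the
        # mismatch (e.g. a measure on R^m vs. one on R^n) the paper addresses.
        raise ValueError("KL-divergence requires distributions on the same space")
    mask = p > 0  # terms with p(x) = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(kl_divergence(p, q))  # ≈ 0.0253
```

Calling `kl_divergence(p, np.array([0.5, 0.5]))` raises an error: the two distributions are defined on spaces of different sizes, which mirrors the question posed in the abstract.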


Similar resources

Distances between probability distributions via characteristic functions and biasing

In a spirit close to classical Stein’s method, we introduce a new technique to derive first-order ODEs on differences of characteristic functions. Then, using concentration inequalities and Fourier transform tools, we convert this information into sharp bounds for the so-called smooth Wasserstein metrics which frequently arise in Stein’s method theory. Our methodology is particularly efficient w...

Full text

Schubert Varieties and Distances between Subspaces of Different Dimensions

We resolve a basic problem on subspace distances that often arises in applications: How can the usual Grassmann distance between equidimensional subspaces be extended to subspaces of different dimensions? We show that a natural solution is given by the distance of a point to a Schubert variety within the Grassmannian. This distance reduces to the Grassmann distance when the subspaces are equidi...
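For the equidimensional case the snippet refers to, the Grassmann distance is the 2-norm of the vector of principal angles between the two subspaces. A minimal NumPy sketch (our own illustration, not code from the paper):

```python
import numpy as np

def grassmann_distance(A, B):
    """Grassmann distance between the equidimensional subspaces spanned by
    the columns of A and B: the 2-norm of the principal angles."""
    Qa, _ = np.linalg.qr(A)  # orthonormal bases for each subspace
    Qb, _ = np.linalg.qr(B)
    # Singular values of Qa^T Qb are the cosines of the principal angles.
    s = np.clip(np.linalg.svd(Qa.T @ Qb, compute_uv=False), -1.0, 1.0)
    theta = np.arccos(s)
    return float(np.linalg.norm(theta))

# Two 2-dimensional subspaces of R^4 sharing one direction
A = np.array([[1., 0.], [0., 1.], [0., 0.], [0., 0.]])
B = np.array([[1., 0.], [0., 0.], [0., 1.], [0., 0.]])
print(grassmann_distance(A, B))  # pi/2: angles are 0 and pi/2
```

Extending this quantity to subspaces of different dimensions is precisely the problem the cited paper solves via distances to Schubert varieties.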

Full text

On a class of perimeter-type distances of probability distributions

The class $I_{f,p}$, $p \in (1, \infty]$, of $f$-divergences investigated in this paper generalizes an $f$-divergence introduced by the author in [9] and applied there and by Reschenhofer and Bomze [11] in different areas of hypotheses testing. The main result of the present paper ensures that, for every $p \in (1, \infty)$, the square root of the corresponding divergence defines a distance on the set of probability d...

Full text

On the distances between probability density functions

We give estimates of the distance between the densities of the laws of two functionals F and G on the Wiener space in terms of the Malliavin-Sobolev norm of F −G. We actually consider a more general framework which allows one to treat with similar (Malliavin type) methods functionals of a Poisson point measure (solutions of jump type stochastic equations). We use the above estimates in order to...

Full text

Distances between Distributions: Comparing Language Models

Language models are used in a variety of fields in order to support other tasks: classification, next-symbol prediction, pattern analysis. In order to compare language models, or to measure the quality of an acquired model with respect to an empirical distribution, or to evaluate the progress of a learning process, we propose to use distances based on the L2 norm, or quadratic distances. We pro...
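The quadratic distance mentioned above is just the L2 norm of the difference between two distributions over a common event set. A tiny sketch under that assumption (illustrative names and values, not from the paper):

```python
import numpy as np

def l2_distance(p, q):
    """Quadratic (L2) distance between two distributions on a common
    finite event set, e.g. next-symbol probabilities of two language models."""
    return float(np.linalg.norm(np.asarray(p, float) - np.asarray(q, float)))

p = [0.5, 0.3, 0.2]  # next-symbol distribution of model 1
q = [0.4, 0.4, 0.2]  # next-symbol distribution of model 2
print(l2_distance(p, q))  # sqrt(0.01 + 0.01 + 0) ≈ 0.1414
```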

Full text


Journal

Journal title: IEEE Transactions on Information Theory

Year: 2022

ISSN: ['0018-9448', '1557-9654']

DOI: https://doi.org/10.1109/tit.2022.3148923